14. Apply Bayes Rule with Additional Conditions

We aim to estimate state beliefs without carrying our entire observation history. We will accomplish this by manipulating our posterior, p(x_t | z_{1:t}, u_{1:t}, m), to obtain a recursive state estimator. For this to work, we must demonstrate that our current belief, bel(x_t), can be expressed by the belief one step earlier, bel(x_{t-1}), and then use new data to update only the current belief. This recursive filter is known as the Bayes Localization filter or Markov Localization, and it enables us to avoid carrying historical observation and motion data. We will achieve this recursive state estimator using Bayes' rule, the law of total probability, and the Markov assumption.
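Written out in the notation used throughout this lesson (x_t is the state at time t, z_{1:t} the observations up to time t, u_{1:t} the controls, and m the map), the belief we want to estimate is:

bel(x_t) = p(x_t | z_{1:t}, u_{1:t}, m)

The recursion we are building will let us compute bel(x_t) from bel(x_{t-1}) plus only the newest observation z_t and control u_t.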

We take the first step towards our recursive structure by splitting our observation vector z_{1:t} into the current observation z_t and the previous observations z_{1:t-1}. The posterior can then be rewritten as p(x_t | z_t, z_{1:t-1}, u_{1:t}, m).

Now we apply Bayes' rule, with an additional challenge: the presence of multiple distributions on the right side (likelihood, prior, normalizing constant). How can we best handle multiple conditions within Bayes' rule? As a hint, we can use substitution, where x_t is a, and z_t, the observation vector at time t, is b. Don't forget to include z_{1:t-1}, u_{1:t}, and m as well.
Bayes' Rule: p(a | b) = p(b | a) p(a) / p(b)
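For reference, here is the same rule with an extra condition c carried along in every term; this is a sketch of the general pattern, where c stands in for whatever additional conditions ride along (here, z_{1:t-1}, u_{1:t}, and m):

p(a | b, c) = p(b | a, c) p(a | c) / p(b | c)

Nothing about the rule changes; every probability on both sides simply gains the same extra conditions.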
Quiz
Please apply Bayes' rule to determine the right side of Bayes' rule, where the posterior, p(x_t | z_t, z_{1:t-1}, u_{1:t}, m), is:
(A)
(B)
(C)
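To convince yourself that extra conditions simply ride along unchanged through Bayes' rule, here is a small numeric check on a toy joint distribution over three binary variables. The distribution values are arbitrary, chosen only for illustration; any valid joint would work.

```python
# Toy joint distribution p(a, b, c) over three binary variables.
# The values are arbitrary (hypothetical) and sum to 1.
joint = {
    (0, 0, 0): 0.02, (0, 0, 1): 0.10, (0, 1, 0): 0.08, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20, (1, 1, 0): 0.12, (1, 1, 1): 0.28,
}

def p(**fixed):
    """Marginal probability of the assignments in `fixed`, e.g. p(a=1, c=0)."""
    keys = ("a", "b", "c")
    return sum(pr for assignment, pr in joint.items()
               if all(dict(zip(keys, assignment))[k] == v for k, v in fixed.items()))

def cond(target, given):
    """Conditional probability p(target | given); both are dicts of assignments."""
    return p(**{**target, **given}) / p(**given)

# Left side: p(a | b, c) computed directly from the joint.
lhs = cond({"a": 1}, {"b": 1, "c": 0})

# Right side: Bayes' rule with the extra condition c carried through each term:
# p(a | b, c) = p(b | a, c) * p(a | c) / p(b | c)
rhs = cond({"b": 1}, {"a": 1, "c": 0}) * cond({"a": 1}, {"c": 0}) / cond({"b": 1}, {"c": 0})

print(abs(lhs - rhs) < 1e-12)  # prints: True
```

In the localization posterior, c corresponds to the bundle of conditions z_{1:t-1}, u_{1:t}, m, which appears unchanged in the likelihood, the prior, and the normalizer.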